Nonlinear dimension reduction for conditional quantiles

Authors

Abstract

In practice, data often display heteroscedasticity, making quantile regression (QR) a more appropriate methodology. Modeling the data while maintaining a flexible nonparametric fit requires smoothing over a high-dimensional space, which may not be feasible when the number of predictor variables is large. This problem necessitates dimension reduction techniques for conditional quantiles, which focus on extracting linear combinations of the predictors without losing information about the conditional quantile. However, nonlinear features can often achieve greater dimension reduction. We therefore present the first algorithm for estimating a nonlinear extension of the central quantile subspace (CQS) using kernel data. First, we describe the feature CQS within the framework of a reproducing kernel Hilbert space, and second, we illustrate its performance through simulation examples and real data applications. Specifically, we emphasize visualizing various aspects of the data structure using the first two feature extractors, and we highlight the ability to combine the proposed algorithm with classification algorithms. The results show that the feature CQS is an effective tool for performing nonlinear dimension reduction for conditional quantiles.
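Quantile regression replaces squared-error loss with the asymmetric pinball (check) loss, whose minimizer is the τ-th quantile. A minimal NumPy sketch (illustrative only, not the paper's method) confirming that minimizing the empirical pinball loss over a grid recovers the sample quantile:

```python
import numpy as np

def pinball_loss(y, q, tau):
    # Pinball (check) loss: tau-weighted absolute deviations around q.
    u = y - q
    return np.mean(np.maximum(tau * u, (tau - 1.0) * u))

rng = np.random.default_rng(0)
y = rng.normal(size=10_000)  # standard normal sample

# Minimizing the empirical pinball loss over candidate values q
# recovers the empirical tau-quantile (up to the grid resolution).
tau = 0.9
grid = np.linspace(-3.0, 3.0, 601)
losses = [pinball_loss(y, q, tau) for q in grid]
best = grid[np.argmin(losses)]
print(best, np.quantile(y, tau))  # both near the N(0,1) 0.9-quantile (about 1.28)
```

This is the one-dimensional, unconditional case; conditional QR minimizes the same loss over functions of the predictors, which is where the smoothing and dimension-reduction issues discussed in the abstract arise.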


Similar articles

Predicting Conditional Quantiles via Reduction to Classification

We show how to reduce the process of predicting conditional quantiles (and the median in particular) to solving classification. The accompanying theoretical statement shows that the regret of the classifier bounds the regret of the quantile regression under a quantile loss. We also test this reduction empirically against existing quantile regression methods on large real-world datasets and disc...
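The reduction can be illustrated with a toy version: train one classifier per threshold t to predict the event {Y ≤ t}, then report the smallest t at which the estimated P(Y ≤ t | x) reaches τ. The sketch below is a hypothetical simplification that uses a windowed (kernel) classifier rather than a learned one:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000
x = rng.uniform(0.0, 1.0, n)
y = x + rng.normal(scale=0.1, size=n)  # conditional median of Y given X = x0 is x0

def prob_le(t, x0, h=0.05):
    # Toy "classifier" for the label 1{Y <= t}: local fraction near x0.
    mask = np.abs(x - x0) < h
    return np.mean(y[mask] <= t)

def quantile_estimate(x0, tau, thresholds=np.arange(0.0, 1.21, 0.02)):
    # Smallest threshold t whose classifier puts at least tau mass on {Y <= t}.
    for t in thresholds:
        if prob_le(t, x0) >= tau:
            return t
    return thresholds[-1]

est_median = quantile_estimate(0.5, 0.5)
est_upper = quantile_estimate(0.5, 0.9)
print(est_median, est_upper)  # median near 0.5; the 0.9-quantile sits above it
```

Any probabilistic binary classifier could play the role of `prob_le`; the regret bound mentioned in the abstract ties the quality of those classifiers to the quality of the resulting quantile estimates.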


Nonlinear Dimension Reduction

A series of different data sets were used to test eight different nonlinear dimension reduction methods. The data sets provided insight into various ways of using the methods and into their applications. The results are compared across the different methods and reasons for their behaviour are sought. The results are mostly quite encouraging and usable.


Approximating Conditional Distribution Functions Using Dimension Reduction

Motivated by applications to prediction and forecasting, we suggest methods for approximating the conditional distribution function of a random variable Y given a dependent random d-vector X. The idea is to estimate not the distribution of Y |X, but that of Y |θX, where the unit vector θ is selected so that the approximation is optimal under a least-squares criterion. We show that θ may be esti...
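The idea of selecting the direction θ by a least-squares criterion can be sketched on a toy two-predictor example: score each candidate unit vector by how well a smoothed estimate of P(Y ≤ y | θᵀX) explains the indicators 1{Y ≤ y}. Everything below (the Nadaraya-Watson smoother, the grid search, all names) is an illustrative assumption, not the authors' estimator:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
X = rng.normal(size=(n, 2))
true_theta = np.array([0.8, 0.6])              # Y depends on X only through this index
y = X @ true_theta + 0.3 * rng.normal(size=n)

def ls_score(theta, h=0.2, ygrid=np.linspace(-2.0, 2.0, 9)):
    # Least-squares criterion: average squared gap between the indicator
    # 1{Y <= yg} and a Nadaraya-Watson estimate of P(Y <= yg | theta @ X).
    z = X @ theta
    w = np.exp(-0.5 * ((z[:, None] - z[None, :]) / h) ** 2)
    wsum = w.sum(axis=1)
    score = 0.0
    for yg in ygrid:
        ind = (y <= yg).astype(float)
        fhat = (w @ ind) / wsum
        score += np.mean((ind - fhat) ** 2)
    return score

# Search unit vectors over a grid of angles; a direction and its negation
# index the same subspace, so [0, pi) covers all candidates.
angles = np.linspace(0.0, np.pi, 37)
scores = [ls_score(np.array([np.cos(a), np.sin(a)])) for a in angles]
best_angle = angles[np.argmin(scores)]
print(best_angle)  # should land near arctan2(0.6, 0.8), about 0.64
```

A direction far from `true_theta` leaves extra conditional variability in Y unexplained, inflating the criterion, which is why the minimizer tracks the true index direction.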


Nonparametric Prewhitening Estimators for Conditional Quantiles

We define a nonparametric prewhitening method for estimating conditional quantiles based on local linear quantile regression. We characterize the bias, variance and asymptotic normality of the proposed estimator. Under weak conditions our estimator can achieve bias reduction and have the same variance as the local linear quantile estimators. A small set of Monte Carlo simulations is carried out...


On M-estimators of Approximate Quantiles and Approximate Conditional Quantiles

M-estimators introduced in Huber (1964) provide a class of robust estimators of a center of symmetry of a symmetric probability distribution which also have very high efficiency at the model. However it is not clear what they do estimate when the probability distributions are nonsymmetric. In this paper we first show that in the case of arbitrary, not necessarily symmetric probability distributions,...



Journal

Journal title: Advances in Data Analysis and Classification

Year: 2021

ISSN: 1862-5355, 1862-5347

DOI: https://doi.org/10.1007/s11634-021-00439-6